4,886 research outputs found

    A quantum leap in informal benchmarking : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Organisational Excellence at Massey University, Palmerston North, Manawatu, New Zealand

    Despite the paucity of literature on informal benchmarking and the consequent lack of understanding of it, informal benchmarking has outranked formal (established) benchmarking, placing 4th out of the 20 most used business improvement tools in a 2008 Global Benchmarking Network (GBN) survey of 450 organisations worldwide. This paradox is exacerbated by the growing popularity of informal benchmarking, even though its effectiveness does not match its widespread use. Two significant gaps therefore need to be filled: firstly, to develop a theoretical understanding of informal benchmarking, and secondly, to investigate how to increase its effectiveness as an organisational improvement tool. A pragmatic mixed-method quantitative-qualitative sequential design using an abductive-deductive-inductive approach is adopted. The product of abduction is a preliminary conceptual model of informal benchmarking drawn from a transdisciplinary academic review of benchmarking, informal learning, organisational learning and knowledge management, augmented by concepts from quantum thinking, innovation and positive deviance. The model informs the quantitative survey questionnaire; the deductive results of 81 survey responses from 14 countries inform in-depth semi-structured interviews with 16 informants from 7 countries, the resulting dataset being inductively coded into conceptually driven dendrograms. The integrated findings refine the conceptual model of informal benchmarking and develop a toolset-based application model (a pragmatic outcome of the conceptual model), a maturity assessment framework and an eco-system strategy. From these, an informal benchmarking roadmap is synthesised, representing a sustainable platform for informal benchmarking to be deployed as an effective organisational improvement initiative. The research sets the stage for a leap in scholarly understanding of informal benchmarking in the wider context of business and organisational improvement, and offers organisational improvement practitioners an invaluable, cost-effective solution in a time-scarce executive world. This pragmatic study has possibly opened up a different epistemological stance within the benchmarking field by advocating an organic approach to benchmarking, in contrast to the highly methodical approaches associated with conventional benchmarking.

    mplot: An R Package for Graphical Model Stability and Variable Selection Procedures

    The mplot package provides an easy-to-use implementation of model stability and variable inclusion plots (Müller and Welsh 2010; Murray, Heritier, and Müller 2013) as well as the adaptive fence (Jiang, Rao, Gu, and Nguyen 2008; Jiang, Nguyen, and Rao 2009) for linear and generalised linear models. We provide a number of innovations on the standard procedures and address many practical implementation issues, including the addition of redundant variables, interactive visualisations and approximating logistic models with linear models. An option is provided that combines our bootstrap approach with glmnet for higher dimensional models. The plots and graphical user interface leverage state-of-the-art web technologies to facilitate interaction with the results. The speed of implementation comes from the leaps package and cross-platform multicore support.
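
    Since mplot is an R package, the following is only a conceptual sketch, in Python, of the bootstrap idea behind variable inclusion plots: refit a model on bootstrap resamples and record how often each variable is selected. The use of LassoCV as the per-resample selector, and all data and names in the snippet, are illustrative assumptions and not part of the mplot interface.

        # Conceptual sketch of bootstrap variable-inclusion frequencies
        # (illustrative only; mplot itself is an R package with a different interface).
        import numpy as np
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(0)
        n, p = 200, 6
        X = rng.normal(size=(n, p))
        y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=n)  # only x0 and x1 carry signal

        n_boot = 100
        inclusion = np.zeros(p)
        for _ in range(n_boot):
            idx = rng.integers(0, n, size=n)         # resample rows with replacement
            fit = LassoCV(cv=5).fit(X[idx], y[idx])  # penalised fit acts as the variable selector
            inclusion += np.abs(fit.coef_) > 1e-8    # count which variables were kept

        print("bootstrap inclusion frequency per variable:", inclusion / n_boot)

    Variables with inclusion frequencies near 1 across resamples are the stable ones; plotting these frequencies against model size or penalty strength is the kind of summary a variable inclusion plot conveys.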

    The effect of an annual cull on the sett usage patterns of the Eurasian Badger


    Recordkeeping in a small nonprofit organization


    MDCC: Multi-Data Center Consistency

    Replicating data across multiple data centers not only allows moving the data closer to the user, and thus reduces latency for applications, but also increases availability in the event of a data center failure. It is therefore not surprising that companies like Google, Yahoo, and Netflix already replicate user data across geographically different regions. However, replication across data centers is expensive. Inter-data center network delays are in the hundreds of milliseconds and vary significantly. Synchronous wide-area replication is therefore considered infeasible with strong consistency, and current solutions either settle for asynchronous replication, which risks losing data in the event of failures, restrict consistency to small partitions, or give up consistency entirely. With MDCC (Multi-Data Center Consistency), we describe the first optimistic commit protocol that does not require a master or partitioning and is strongly consistent at a cost similar to eventually consistent protocols. MDCC can commit transactions in a single round-trip across data centers in the normal operational case. We further propose a new programming model which empowers the application developer to handle longer and unpredictable latencies caused by inter-data center communication. Our evaluation using the TPC-W benchmark with MDCC deployed across 5 geographically diverse data centers shows that MDCC achieves throughput and latency similar to eventually consistent quorum protocols and sustains a data center outage without a significant impact on response times, while guaranteeing strong consistency.
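
    The single round-trip, quorum-style commit idea can be pictured with a deliberately simplified sketch. This is not the actual MDCC protocol; the data center names, the send_accept stand-in and the random failure model are all hypothetical, and the sketch only shows a transaction committing once a majority of sites acknowledge within one message round.

        # Deliberately simplified quorum-commit sketch (not the actual MDCC protocol).
        import random

        DATA_CENTERS = ["us-east", "us-west", "eu-west", "asia", "australia"]  # hypothetical sites

        def send_accept(data_center: str, txn_id: str) -> bool:
            """Stand-in for one cross-data-center accept request; a real system sends a message."""
            return random.random() > 0.1  # assume roughly 90% of accept requests succeed

        def commit(txn_id: str) -> bool:
            # One parallel message round: ask every data center to accept the transaction.
            acks = sum(send_accept(dc, txn_id) for dc in DATA_CENTERS)
            quorum = len(DATA_CENTERS) // 2 + 1   # classic majority quorum
            return acks >= quorum                 # commit only if a quorum acknowledged

        print("txn-42 committed:", commit("txn-42"))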

    Thomas Glassey : Queensland Labour leader


    Assessing Nuclear Proliferation by Using System Dynamics Modeling

    The goal of this project is to understand the influence of social and cultural factors on nuclear proliferation. We identified factors that contribute to a country's motivation to initiate a nuclear weapons program from the political science literature, obtained relevant social and cultural information, and developed a system dynamics model. System dynamics is used to understand complex interactive systems with feedback. The modeling process began with the construction of a causal loop diagram, which contains the essential elements that account for nuclear proliferation and the relationships between them. These relationships are represented by arrows labeled either positively or negatively to show their causal direction: a positive sign represents a direct relationship and a negative sign represents an indirect or inverse relationship. The causal loop diagram is a mental model used to construct a stock and flow simulation model, which can be quantified with equations that capture the relationships among the elements of the system. Through system dynamics we were able to interpret the levels of variables, which indicate how the system changes and give us insight into how each variable affects proliferation. The software used for our project was Vensim, by Ventana Systems, which enabled us to create a system dynamics model, including our causal loop diagram, and then place relevant variables into our stock and flow diagram. We are in the process of completing our stock and flow diagram, which will help us gain a better understanding of the motivations for state-level nuclear proliferation. Proliferation assessment involves an immense number of factors. Through preliminary simulations of the model we demonstrated the impact of autocratic versus democratic governments on the motivation to proliferate, which is mediated by the differential levels of integration that result from economic trade. The model allows for expansion and lays the foundation for further investigation.
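
    As a minimal illustration of the stock and flow mechanism described above, the following sketch integrates a single hypothetical stock with one inflow and one outflow using Euler steps. The variable names (perceived_threat, integration_factor) and all numeric values are invented for illustration and do not come from the project's Vensim model.

        # Minimal stock-and-flow simulation with Euler integration (hypothetical variables
        # and values; this is not the project's model).
        DT = 0.25      # time step in years
        STEPS = 40     # simulate 10 years

        motivation = 10.0          # stock: motivation to proliferate (arbitrary units)
        perceived_threat = 2.0     # constant inflow rate (a "+" causal link)
        integration_factor = 0.15  # fraction of the stock drained per year (a "-" causal link)

        for _ in range(STEPS):
            inflow = perceived_threat
            outflow = integration_factor * motivation
            motivation += DT * (inflow - outflow)   # the stock accumulates the net flow

        print(f"motivation after {STEPS * DT:.0f} years: {motivation:.2f}")

    In a full model each stock would have its own inflows and outflows, and the rate equations would encode the positive and negative causal links from the causal loop diagram.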